A Quadratically Convergent Newton Method for Computing the Nearest Correlation Matrix
Authors
Abstract
The nearest correlation matrix problem is to find a correlation matrix which is closest to a given symmetric matrix in the Frobenius norm. The well-studied dual approach is to reformulate this problem as an unconstrained continuously differentiable convex optimization problem. Gradient methods and quasi-Newton methods such as BFGS have been used directly to obtain globally convergent methods. Since the objective function in the dual approach is not twice continuously differentiable, these methods converge at best linearly. In this paper, we investigate a Newton-type method for the nearest correlation matrix problem. Based on recent developments on strongly semismooth matrix-valued functions, we prove the quadratic convergence of the proposed Newton method. Numerical experiments confirm the fast convergence and the high efficiency of the method.
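As a rough illustration of the dual approach described above (a sketch of mine, not the authors' implementation), the snippet below minimizes the dual function theta(y) = 0.5*||(G + Diag(y))_+||_F^2 - e^T y, whose gradient is diag((G + Diag(y))_+) - e, and takes damped Newton steps with a generalized Jacobian assembled from the eigendecomposition of G + Diag(y). The helper names (proj_psd, newton_ncm), the dense column-by-column Jacobian assembly, the tiny diagonal shift, and the simple backtracking are simplifications assumed here for illustration; a serious implementation would solve the Newton system iteratively.

```python
import numpy as np

def proj_psd(S):
    # Projection onto the positive semidefinite cone via eigendecomposition.
    lam, P = np.linalg.eigh(S)
    return (P * np.maximum(lam, 0.0)) @ P.T

def theta(y, G):
    # Dual objective theta(y) = 0.5*||(G + Diag(y))_+||_F^2 - e^T y.
    Xp = proj_psd(G + np.diag(y))
    return 0.5 * np.sum(Xp * Xp) - np.sum(y)

def dual_grad(y, G):
    # Gradient of the dual objective: diag((G + Diag(y))_+) - e.
    X = proj_psd(G + np.diag(y))
    return np.diag(X) - np.ones(len(y)), X

def omega_matrix(lam):
    # Omega_ij = (lam_i^+ - lam_j^+) / (lam_i - lam_j) for distinct eigenvalues,
    # and the indicator 1{lam > 0} for (numerically) coincident ones.
    pos = np.maximum(lam, 0.0)
    diff = lam[:, None] - lam[None, :]
    num = pos[:, None] - pos[None, :]
    with np.errstate(divide="ignore", invalid="ignore"):
        return np.where(np.abs(diff) > 1e-12, num / diff,
                        (lam[None, :] > 0).astype(float))

def newton_ncm(G, tol=1e-7, max_iter=50):
    # Minimal damped-Newton sketch for the dual problem; this is only a toy
    # version of the kind of semismooth Newton iteration discussed in the paper.
    n = G.shape[0]
    y = np.zeros(n)
    for _ in range(max_iter):
        g, _ = dual_grad(y, G)
        if np.linalg.norm(g) <= tol:
            break
        lam, P = np.linalg.eigh(G + np.diag(y))
        Om = omega_matrix(lam)
        # One generalized Jacobian element, assembled column by column:
        # V e_k = diag( P (Om ∘ (P^T Diag(e_k) P)) P^T ).
        V = np.empty((n, n))
        for k in range(n):
            M = np.outer(P[k], P[k])          # equals P^T Diag(e_k) P
            V[:, k] = np.diag(P @ (Om * M) @ P.T)
        d = np.linalg.solve(V + 1e-10 * np.eye(n), -g)  # tiny shift guards against singularity
        # Simple Armijo backtracking on the dual objective.
        t, f0 = 1.0, theta(y, G)
        while theta(y + t * d, G) > f0 + 1e-4 * t * (g @ d) and t > 1e-8:
            t *= 0.5
        y = y + t * d
    return proj_psd(G + np.diag(y)), y

# Example: an indefinite symmetric matrix with unit diagonal.
G = np.array([[1.0, 1.0, 0.0],
              [1.0, 1.0, 1.0],
              [0.0, 1.0, 1.0]])
X, _ = newton_ncm(G)
print(np.round(X, 4))          # nearest correlation matrix (approximately)
print(np.diag(X))              # unit diagonal up to the tolerance
print(np.linalg.eigvalsh(X))   # nonnegative eigenvalues
```

On a small test matrix like the one above, the diagonal of the computed X is one up to the stopping tolerance and its eigenvalues are nonnegative, which is the defining property of a correlation matrix.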
Related resources
A Quadratically Convergent Interior-Point Algorithm for the P*(κ)-Matrix Horizontal Linear Complementarity Problem
In this paper, we present a new path-following interior-point algorithm for P*(κ)-horizontal linear complementarity problems (HLCPs). The algorithm uses only full Newton steps, which has the advantage that no line searches are needed. Moreover, we obtain the currently best known iteration bound for small-update methods, which is as good as that of the linear analogue.
A Sequential Semismooth Newton Method for the Nearest Low-rank Correlation Matrix Problem
Based on the well-known result that the sum of the largest eigenvalues of a symmetric matrix can be represented as a semidefinite programming problem (SDP), we formulate the nearest low-rank correlation matrix problem as a nonconvex SDP and propose a numerical method that solves a sequence of least-square problems. Each of the least-square problems can be solved by a specifically designed semismooth Newton method...
A preconditioned Newton algorithm for the nearest correlation matrix
Various methods have been developed for computing the correlation matrix nearest in the Frobenius norm to a given matrix. We focus on a quadratically convergent Newton algorithm recently derived by Qi and Sun. Various improvements to the efficiency and reliability of the algorithm are introduced. Several of these relate to the linear algebra: the Newton equations are solved by minres instead of...
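As a generic illustration of that kind of linear-algebra choice (not the preconditioner derived in the cited paper), the snippet below solves a symmetric linear system with SciPy's minres, supplying a simple Jacobi (diagonal) preconditioner as a LinearOperator; the matrix V and the right-hand side are placeholders standing in for a Newton system.

```python
import numpy as np
from scipy.sparse.linalg import LinearOperator, minres

# Hypothetical stand-in for a Newton system V d = rhs; V is symmetric
# positive definite by construction, so minres applies.
rng = np.random.default_rng(0)
B = rng.standard_normal((50, 50))
V = B @ B.T / 50.0 + np.eye(50)
rhs = rng.standard_normal(50)

# Jacobi preconditioner: multiply by the inverse of diag(V).
d_inv = 1.0 / np.diag(V)
M = LinearOperator(V.shape, matvec=lambda x: d_inv * x)

d, info = minres(V, rhs, M=M, maxiter=500)
print("minres exit flag:", info)
print("residual norm:", np.linalg.norm(V @ d - rhs))
```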
A Majorized Penalty Approach for Calibrating Rank Constrained Correlation Matrix Problems
In this paper, we aim at finding a nearest correlation matrix to a given symmetric matrix, measured by the componentwise weighted Frobenius norm, with a prescribed rank and bound constraints on its correlations. This is in general a non-convex and difficult problem due to the presence of the rank constraint. To deal with this difficulty, we first consider a penalized version of this problem and...
Quadratic Convergence and Numerical Experiments of Newton’s Method for Computing the Nearest Correlation Matrix
The nearest correlation matrix problem is to find a correlation matrix which is closest to a given symmetric matrix under the Frobenius norm. The well-studied dual approach is to reformulate this problem as an unconstrained continuously differentiable convex optimization problem. Gradient methods and quasi-Newton methods like BFGS have been used directly to obtain globally convergent methods. S...
Journal: SIAM J. Matrix Analysis Applications
Volume: 28, Issue: -
Pages: -
Publication date: 2006